69 research outputs found

    The Krigifier: A Procedure for Generating Pseudorandom Nonlinear Objective Functions for Computational Experimentation

    Comprehensive computational experiments to assess the performance of algorithms for numerical optimization require (among other things) a practical procedure for generating pseudorandom nonlinear objective functions. We propose a procedure that is based on the convenient fiction that objective functions are realizations of stochastic processes. This report details the calculations necessary to implement our procedure for the case of certain stationary Gaussian processes and presents a specific implementation in the statistical programming language S-PLUS.
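    The paper's implementation is in S-PLUS; the sketch below is a hedged Python analogue of the general idea only. It draws a Gaussian-process realization at scattered nodes and returns a smooth kriging interpolant as the pseudorandom objective. The function names, the squared-exponential covariance, and all parameter values are illustrative assumptions, not the authors' code.

    ```python
    # Minimal Python sketch of a "krigified" objective; illustrative assumptions throughout.
    import numpy as np

    rng = np.random.default_rng(0)

    def gaussian_cov(A, B, scale=0.5):
        """Stationary squared-exponential covariance between point sets A and B."""
        d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(axis=2)
        return np.exp(-d2 / (2.0 * scale ** 2))

    def make_objective(dim=2, n_nodes=40, box=(0.0, 1.0)):
        """Return one pseudorandom nonlinear objective: a kriging interpolant
        of a Gaussian-process realization sampled at scattered nodes."""
        nodes = rng.uniform(*box, size=(n_nodes, dim))
        K = gaussian_cov(nodes, nodes) + 1e-10 * np.eye(n_nodes)   # jitter for stability
        y = np.linalg.cholesky(K) @ rng.standard_normal(n_nodes)   # GP realization at the nodes
        alpha = np.linalg.solve(K, y)
        def f(x):
            x = np.atleast_2d(x)
            return gaussian_cov(x, nodes) @ alpha                  # smooth nonlinear interpolant
        return f

    f = make_objective()
    print(f([0.3, 0.7]))
    ```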

    Parallel Deterministic and Stochastic Global Minimization of Functions with Very Many Minima

    The optimization of three problems with high dimensionality and many local minima is investigated under five different optimization algorithms: DIRECT, simulated annealing, Spall’s SPSA algorithm, the KNITRO package, and QNSTOP, a new algorithm developed at Indiana University.
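    The paper's three test problems and five solver implementations are not reproduced here. As a small illustration of the setting, the sketch below minimizes Griewank, a standard test function with very many local minima, using SciPy's dual annealing, a simulated-annealing variant; the choice of test function, dimension, and solver settings are assumptions made for this example.

    ```python
    # Illustrative only: a many-minima test function minimized by a simulated-annealing variant.
    import numpy as np
    from scipy.optimize import dual_annealing

    def griewank(x):
        x = np.asarray(x)
        i = np.arange(1, x.size + 1)
        return 1.0 + np.sum(x ** 2) / 4000.0 - np.prod(np.cos(x / np.sqrt(i)))

    dim = 10
    result = dual_annealing(griewank, bounds=[(-600.0, 600.0)] * dim, seed=0)
    print(result.x, result.fun)   # global minimum is f(0) = 0
    ```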

    Adjusting process count on demand for petascale global optimization

    There are many challenges that need to be met before efficient and reliable computation at the petascale is possible. Many scientific and engineering codes running at the petascale are likely to be memory intensive, which makes thrashing a serious problem for many petascale applications. One way to overcome this challenge is to use a dynamic number of processes, so that the total amount of memory available for the computation can be increased on demand. This paper describes modifications made to the massively parallel global optimization code pVTdirect in order to allow for a dynamic number of processes. In particular, the modified version of the code monitors memory use and spawns new processes if the amount of available memory is determined to be insufficient. The primary design challenges are discussed, and performance results are presented and analyzed.
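    pVTdirect itself is an MPI code, and its actual spawning logic is not reproduced here. The toy sketch below only illustrates the general mechanism described above (monitor available memory, spawn additional MPI processes on demand) using mpi4py and psutil; the threshold, the worker count, and the "worker.py" script name are hypothetical.

    ```python
    # Toy sketch of memory-triggered dynamic process spawning; not pVTdirect's implementation.
    import sys
    import psutil
    from mpi4py import MPI

    MIN_FREE_BYTES = 2 * 1024**3          # assumed threshold: 2 GiB of available memory

    def maybe_spawn_workers(n_new=4):
        """Spawn extra worker processes if available memory falls below the threshold."""
        if psutil.virtual_memory().available < MIN_FREE_BYTES:
            inter = MPI.COMM_SELF.Spawn(sys.executable, args=["worker.py"], maxprocs=n_new)
            return inter                   # intercommunicator connecting to the new workers
        return None

    if __name__ == "__main__":
        inter = maybe_spawn_workers()
        if inter is not None:
            inter.bcast({"task": "resume_subdomain"}, root=MPI.ROOT)  # hand work to the new ranks
    ```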

    The Cost of Numerical Integration in Statistical Decision-theoretic Methods for Robust Design Optimization

    The Bayes principle from statistical decision theory provides a conceptual framework for quantifying uncertainties that arise in robust design optimization. The difficulty with exploiting this framework is computational, as it leads to objective and constraint functions that must be evaluated by numerical integration. Using a prototypical robust design optimization problem, this study explores the computational cost of multidimensional integration (computing expectation) and its interplay with optimization algorithms. It concludes that straightforward application of standard off-the-shelf optimization software to robust design is prohibitively expensive, necessitating adaptive strategies and the use of surrogates.
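    The sketch below is not the paper's prototypical problem; it only illustrates why the cost compounds: the robust objective is an expectation over an uncertainty U, so every evaluation requested by the optimizer triggers a full quadrature sweep. The performance model f, the normal uncertainty, and the 20-point Gauss-Hermite rule are assumptions made for this example.

    ```python
    # Illustrative cost accounting for an expectation-valued objective inside an optimizer loop.
    import numpy as np
    from scipy.optimize import minimize

    nodes, weights = np.polynomial.hermite.hermgauss(20)   # 20-point Gauss-Hermite rule
    sigma = 0.1
    n_inner = 0                                            # counts inner-loop model evaluations

    def f(x, u):
        """Deterministic performance model; this particular form is made up."""
        return (x[0] + u - 1.0) ** 2 + 10.0 * np.sin(3.0 * (x[0] + u))

    def robust_objective(x):
        """E_U[f(x, U)] for U ~ N(0, sigma^2), via the substitution u = sqrt(2)*sigma*t."""
        global n_inner
        n_inner += len(nodes)
        vals = np.array([f(x, np.sqrt(2.0) * sigma * t) for t in nodes])
        return float(weights @ vals / np.sqrt(np.pi))

    res = minimize(robust_objective, x0=[0.0], method="Nelder-Mead")
    print(res.x, res.fun, "inner evaluations:", n_inner)   # cost grows with each uncertain dimension
    ```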

    Molecular Embedding via a Second Order Dissimilarity Parameterized Approach


    Note on the Effectiveness of Stochastic Optimization Algorithms for Robust Design

    Robust design optimization (RDO) uses statistical decision theory and optimization techniques to optimize a design over a range of uncertainty (introduced by the manufacturing process and unintended uses). Since engineering objective functions tend to be costly to evaluate and prohibitively expensive to integrate (required within RDO), surrogates are introduced to allow the use of traditional optimization methods to find solutions. This paper explores the suitability of radically different (deterministic and stochastic) optimization methods to solve prototypical robust design problems. The algorithms include a genetic algorithm using a penalty function formulation, the simultaneous perturbation stochastic approximation (SPSA) method, and two gradient-based constrained nonlinear optimizers (method of feasible directions and sequential quadratic programming). The results show that the fully deterministic standard optimization algorithms are consistently more accurate, consistently more likely to terminate at feasible points, and consistently considerably less expensive than the fully nondeterministic algorithms.
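    Since SPSA is one of the stochastic methods compared, a minimal unconstrained SPSA sketch is given below. The gain-sequence constants and the quadratic stand-in objective are illustrative assumptions, and the paper's constrained robust-design formulation is not reproduced.

    ```python
    # Minimal unconstrained SPSA sketch; constants and test function are illustrative.
    import numpy as np

    def spsa(f, x0, n_iter=200, a=0.1, c=0.1, alpha=0.602, gamma=0.101, seed=0):
        """Simultaneous perturbation stochastic approximation: two function
        evaluations per iteration estimate the full gradient."""
        rng = np.random.default_rng(seed)
        x = np.asarray(x0, dtype=float)
        for k in range(1, n_iter + 1):
            ak = a / k ** alpha
            ck = c / k ** gamma
            delta = rng.choice([-1.0, 1.0], size=x.size)          # Rademacher perturbation
            ghat = (f(x + ck * delta) - f(x - ck * delta)) / (2.0 * ck * delta)
            x = x - ak * ghat
        return x

    # Example: a smooth quadratic stand-in for an expensive engineering objective.
    print(spsa(lambda x: np.sum((x - 1.0) ** 2), x0=np.zeros(4)))
    ```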
